Bounds on optimal transport maps onto log-concave measures

Authors

Abstract

We consider strictly log-concave measures whose log-concavity bounds may degenerate at infinity. We prove that the optimal transport map from the Gaussian onto such a measure is locally Lipschitz, and that the eigenvalues of its Jacobian have controlled growth.
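For context, the statement concerns the Brenier map T = ∇φ transporting the standard Gaussian onto the target measure. A sketch of the Monge–Ampère equation it satisfies, under standard regularity assumptions (the notation μ = e^{-W} dx and φ is ours, not necessarily the paper's):

```latex
% Brenier map T = \nabla\varphi pushing the standard Gaussian on \mathbb{R}^d
% forward to \mu = e^{-W}\,dx; the change-of-variables formula gives the
% Monge--Ampère equation
(2\pi)^{-d/2}\, e^{-|x|^2/2} \;=\; e^{-W(\nabla\varphi(x))}\,\det D^2\varphi(x).
% Local Lipschitz bounds on T correspond to bounds on the eigenvalues
% of the Jacobian D^2\varphi.
```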


Similar articles

Near-Optimal Sample Complexity Bounds for Maximum Likelihood Estimation of Multivariate Log-concave Densities

We study the problem of learning multivariate log-concave densities with respect to a global loss function. We obtain the first upper bound on the sample complexity of the maximum likelihood estimator (MLE) for a log-concave density on ℝ^d, for all d ≥ 4. Prior to this work, no finite sample upper bound was known for this estimator in more than 3 dimensions. In more detail, we prove that for any ...


Valuations on Log-Concave Functions

A classification of SL(n) and translation covariant Minkowski valuations on log-concave functions is established. The moment vector and the recently introduced level set body of log-concave functions are characterized. Furthermore, analogs of the Euler characteristic and volume are characterized as SL(n) and translation invariant valuations on log-concave functions. 2000 AMS subject classificat...


Small ball probability estimates for log-concave measures

We establish a small ball probability inequality for isotropic log-concave probability measures: there exist absolute constants c₁, c₂ > 0 such that if X is an isotropic log-concave random vector in ℝⁿ with ψ₂ constant bounded by b, and if A is a non-zero n × n matrix, then for every ε ∈ (0, c₁) and y ∈ ℝⁿ, P(‖AX − y‖₂ ≤ ε‖A‖_HS) ≤ ε^((c₂ ‖A‖_HS / (b ‖A‖_op))²).


Functional Inequalities for Gaussian and Log-Concave Probability Measures

We give three proofs of a functional inequality for the standard Gaussian measure originally due to William Beckner. The first uses the central limit theorem and a tensorial property of the inequality. The second uses the Ornstein-Uhlenbeck semigroup, and the third uses the heat semigroup. These latter two proofs yield a more general inequality than the one Beckner originally proved. We then ge...


Optimal Concentration of Information Content For Log-Concave Densities

An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean. These bounds significantly improve on the bounds obtained by Bobkov and Madiman (Ann. Probab., 39(4):1528–1543, 2011). Mathematics Subject Classification (2010). Primary 52A40; Secondary 60E15, 94A17.



Journal

Journal title: Journal of Differential Equations

سال: 2021

ISSN: 1090-2732, 0022-0396

DOI: https://doi.org/10.1016/j.jde.2020.09.032